Approximating smooth functions by deep neural networks with sigmoid activation function
Authors
Abstract
We study the approximation power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^{-p/d}, where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask ourselves if we can show the same approximation rate for a simpler and more general class, i.e., DNNs which are defined by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^{-2p}. As a conclusion we quantitatively characterize the approximation power of DNNs in terms of the overall number of nonzero weights W_0 by showing a rate of order W_0^{-p/d}. This result finally helps us to understand which network topology guarantees a given target accuracy.
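As a rough numerical illustration of the rate statement above: a bound of order W_0^{-p/d} implies that reaching a target accuracy eps requires on the order of eps^{-d/p} nonzero weights. A minimal sketch of this back-of-the-envelope inversion (the helper `weights_for_accuracy` and the constant-free form of the bound are illustrative assumptions, not the paper's exact statement):

```python
import math

def weights_for_accuracy(eps: float, p: float, d: int) -> int:
    """Invert the constant-free rate eps ~ W**(-p/d) to estimate the
    number of nonzero weights W needed for target accuracy eps."""
    return math.ceil(eps ** (-d / p))

# Smoother targets (larger p) need fewer weights at the same accuracy,
# while higher input dimension d inflates the exponent d/p -- the usual
# curse of dimensionality hidden in the rate.
print(weights_for_accuracy(0.5, p=1.0, d=2))  # eps^{-2} = 4
print(weights_for_accuracy(0.3, p=1.0, d=2))  # ceil(0.3^{-2}) = 12
```

The point of the sketch is only the scaling behavior, not the constants, which the rate notation suppresses.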
Similar resources
Deep Neural Networks with Multistate Activation Functions
We propose multistate activation functions (MSAFs) for deep neural networks (DNNs). These MSAFs are new kinds of activation functions which are capable of representing more than two states, including the N-order MSAFs and the symmetrical MSAF. DNNs with these MSAFs can be trained via conventional Stochastic Gradient Descent (SGD) as well as mean-normalised SGD. We also discuss how these MSAFs p...
Neural Networks with Smooth Adaptive Activation Functions for Regression
In Neural Networks (NN), Adaptive Activation Functions (AAF) have parameters that control the shapes of activation functions. These parameters are trained along with other parameters in the NN. AAFs have improved performance of Neural Networks (NN) in multiple classification tasks. In this paper, we propose and apply AAFs on feedforward NNs for regression tasks. We argue that applying AAFs in t...
Nonparametric regression using deep neural networks with ReLU activation function
Consider the multivariate nonparametric regression model. It is shown that estimators based on sparsely connected deep neural networks with ReLU activation function and properly chosen network architecture achieve the minimax rates of convergence (up to log n-factors) under a general composition assumption on the regression function. The framework includes many well-studied structural constrain...
Approximation of smooth functions by neural networks
We review some aspects of our recent work on the approximation of functions by neural and generalized translation networks.
Approximating Piecewise-Smooth Functions
We consider the possibility of using locally supported quasi-interpolation operators for the approximation of univariate non-smooth functions. In such a case one usually expects the rate of approximation to be lower than that of smooth functions. It is shown in this paper that prior knowledge of the type of ’singularity’ of the function can be used to regain the full approximation power of the ...
Journal
Journal title: Journal of Multivariate Analysis
Year: 2021
ISSN: 0047-259X, 1095-7243
DOI: https://doi.org/10.1016/j.jmva.2020.104696